Estimation of Linear Vectorial Semiparametric Models by Least Squares
Abstract
A semiparametric model is a statistical model that contains both a parametric and a nonparametric component and can therefore be viewed as a mixture of the two. Its theoretical properties, such as large-sample behavior, have been studied extensively. However, most of this research treats scalar-valued data, in which the dimension of the observation at each epoch is one. In spatial data processing fields such as econometrics, GPS, engineering surveying, and engineering deformation monitoring, the dimension of the observations at each epoch is usually greater than one, so the models are vectorial rather than scalar. This paper focuses on the estimation theory of vectorial semiparametric models under the least-squares principle. We derive the formulas for the weighted-function estimator and the spline estimator. Kernel and nearest-neighbor weights are the most common weight functions. In kernel estimation, the weights of the observations are determined by kernel functions, which are usually probability density functions, as in the Nadaraya-Watson estimator. In nearest-neighbor estimation, only the nearest neighbors influence the estimate at a given observation point. Spline estimation solves a penalized least-squares problem whose criterion function trades off fidelity to the data against smoothness of the function. The differences between the scalar and vectorial semiparametric models are compared: for the weighted-function estimator, the single weight parameter becomes a weight matrix; for the spline estimator, ordinary multiplication becomes a Kronecker product.
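To make the vectorial setting concrete, the following sketch fits the partially linear model y_i = B'x_i + f(t_i) + e_i by a two-stage (profile) least-squares step with Nadaraya-Watson kernel weights. It is a minimal illustration of the weighted-function idea, not the paper's exact formulas: the Gaussian kernel, the bandwidth h, the function names, and the simulated data are all assumptions chosen for the example. The comment on the stacked form marks where the Kronecker product mentioned in the abstract appears once the vectorial observations are stacked into a single vector.

```python
import numpy as np

def nw_smoother(t, h):
    """Nadaraya-Watson smoother matrix from a Gaussian kernel.
    Row i holds the normalized kernel weights of every design point t_j
    for the point t_i (weights are non-negative and each row sums to one)."""
    u = (t[:, None] - t[None, :]) / h
    K = np.exp(-0.5 * u ** 2)                      # Gaussian kernel (a density)
    return K / K.sum(axis=1, keepdims=True)

def fit_vectorial_partially_linear(Y, X, t, h):
    """Two-stage least-squares fit of the partially linear model
        y_i = B' x_i + f(t_i) + e_i,   y_i in R^d,
    with a kernel smoother for the nonparametric part.
    Y : (n, d) vectorial observations;  X : (n, p) parametric design;
    t : (n,)  nonparametric design points;  h : bandwidth.
    Returns B_hat (p, d) and the fitted nonparametric curve F_hat (n, d)."""
    S = nw_smoother(t, h)                          # n x n weight matrix
    X_tilde = X - S @ X                            # partial out the smooth trend
    Y_tilde = Y - S @ Y
    # Stacked (vec) form: vec(Y_tilde) = (I_d kron (I_n - S)) vec(Y);
    # the scalar smoother of the univariate case turns into a Kronecker product.
    B_hat = np.linalg.solve(X_tilde.T @ X_tilde, X_tilde.T @ Y_tilde)
    F_hat = S @ (Y - X @ B_hat)                    # nonparametric component
    return B_hat, F_hat

# Tiny synthetic check (illustrative data only)
rng = np.random.default_rng(0)
n, p, d = 200, 2, 3
t = np.sort(rng.uniform(0.0, 1.0, n))
X = rng.normal(size=(n, p))
B_true = np.array([[1.0, -0.5, 2.0],
                   [0.3,  1.5, -1.0]])
F_true = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t ** 2])
Y = X @ B_true + F_true + 0.1 * rng.normal(size=(n, d))
B_hat, F_hat = fit_vectorial_partially_linear(Y, X, t, h=0.05)
print(np.round(B_hat, 2))                          # should be close to B_true
```

A spline variant would replace the kernel smoother matrix S by a penalized least-squares smoothing matrix; as the abstract notes, the scalar-versus-vectorial difference again amounts to ordinary products becoming Kronecker products.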
Similar Articles
Robust high-dimensional semiparametric regression using optimized differencing method applied to the vitamin B2 production data
Background and purpose: With the evolution of science, knowledge, and technology, we deal with high-dimensional data in which the number of predictors may considerably exceed the sample size. The main problems with high-dimensional data are the estimation of the coefficients and their interpretation. For high-dimensional problems, classical methods are not reliable because of the large number of predictor variable...
Consistent Covariate Selection and Post Model Selection Inference in Semiparametric Regression
This paper presents a model selection technique of estimation in semiparametric regression models of the type Yi = β′Xi + f(Ti) + Wi, i = 1, ..., n. The parametric and nonparametric components are estimated simultaneously by this procedure. Estimation is based on a collection of finite-dimensional models, using a penalized least squares criterion for selection. We show that by tailoring the ...
Semiparametric Least Squares Estimation of Monotone Single Index Models and Its Application to the Iterative Least Squares Estimation of Binary Choice Models
The Semiparametric Least Squares (SLS) estimation for single index models is studied. Applying the isotonic regression by Ayer et al (1955), the method minimizes the mean squared errors with respect to both finite and infinite dimensional parameters. A proof of consistency and an upper bound of convergence rates are offered. As an application example of the SLS estimation, asymptotic normality ...
Estimation in linear regression models with measurement errors subject to single-indexed distortion
In this paper, we consider statistical inference for linear regression models when neither the response nor the predictors can be directly observed, but are measured with errors in a multiplicative fashion and distorted as single index models of observable confounding variables. We propose a semiparametric profile least squares estimation procedure to estimate the single index. Then we develop ...
Variable Selection in Nonparametric and Semiparametric Regression Models
This chapter reviews the literature on variable selection in nonparametric and semiparametric regression models via shrinkage. We highlight recent developments on simultaneous variable selection and estimation through the methods of least absolute shrinkage and selection operator (Lasso), smoothly clipped absolute deviation (SCAD) or their variants, but restrict our attention to nonparametric a...